The Generalized Gradient at a Multiple Eigenvalue
Author
Abstract
When a symmetric, positive isomorphism between a reflexive Banach space (that is densely and compactly embedded in a Hilbert space) and its dual varies smoothly over a Banach space, its eigenvalues vary in a Lipschitz manner. We calculate the generalized gradient of the extreme eigenvalues at an arbitrary crossing. We apply this to the generalized gradient, with respect to a coefficient in an elliptic operator, of (i) the gap between the operator's first two eigenvalues, and (ii) the distance from a prescribed value to the spectrum of the operator.
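To see in finite dimensions why such eigenvalue functions are Lipschitz but not differentiable at a crossing, consider the hypothetical family A(t) = diag(t, -t) (a toy analogue chosen for illustration, not the operators of the abstract): its ordered eigenvalues cross at t = 0 and the gap equals 2|t|, whose Clarke generalized gradient at 0 is the interval [-2, 2]. The short Python sketch below checks this with one-sided difference quotients.

```python
import numpy as np

def ordered_eigs(t):
    """Ascending eigenvalues of a toy symmetric family A(t) with a crossing at t = 0."""
    A = np.array([[t, 0.0], [0.0, -t]])   # hypothetical 2x2 analogue, not the paper's operator
    return np.linalg.eigvalsh(A)

def gap(t):
    """Gap between the first two eigenvalues; equals 2|t| here, Lipschitz but kinked at 0."""
    lam = ordered_eigs(t)
    return lam[1] - lam[0]

h = 1e-6
right = (gap(h) - gap(0.0)) / h    # one-sided slope from the right: about +2
left = (gap(0.0) - gap(-h)) / h    # one-sided slope from the left:  about -2
print(right, left)                 # the generalized gradient at 0 is the interval [left, right] = [-2, 2]
```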
Similar Papers
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
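For context, the sufficient descent property means the search direction d_k satisfies g_k^T d_k <= -c ||g_k||^2 for some fixed c > 0. As a hedged illustration only (it uses the Zhang-Zhou-Li three-term PRP direction, not necessarily the exact modification of Yu et al. analyzed above), the sketch below verifies numerically that this three-term direction gives g^T d = -||g||^2 exactly, i.e. sufficient descent with c = 1, for arbitrary vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
g_k, g_k1, d_k = rng.standard_normal((3, 5))   # gradient at step k, at step k+1, previous direction
y_k = g_k1 - g_k

beta_prp = (g_k1 @ y_k) / (g_k @ g_k)          # classical Polak-Ribiere-Polyak parameter
theta_k = (g_k1 @ d_k) / (g_k @ g_k)           # extra coefficient of the three-term variant
d_k1 = -g_k1 + beta_prp * d_k - theta_k * y_k  # three-term PRP direction (Zhang-Zhou-Li form)

# The two extra terms cancel in g^T d, leaving g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly.
print(g_k1 @ d_k1, -(g_k1 @ g_k1))
```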
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of line search method, based on eigenvalue analysis. The globa...
A New Inexact Inverse Subspace Iteration for Generalized Eigenvalue Problems
In this paper, we present an inexact inverse subspace iteration method for computing a few eigenpairs of the generalized eigenvalue problem Ax = λBx [Q. Ye and P. Zhang, Inexact inverse subspace iteration for generalized eigenvalue problems, Linear Algebra and its Applications, 434 (2011) 1697-1715]. In particular, the linear convergence property of the inverse subspace iteration is preserved.
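A minimal exact-solve sketch of inverse subspace iteration for Ax = λBx is given below, assuming A is nonsingular and B is symmetric positive definite; the method in the cited reference is the inexact variant, in which the inner linear solve is carried out only approximately by an iterative solver. The function name and iteration counts are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve, qr, eigh

def inverse_subspace_iteration(A, B, k, iters=30):
    """Basic (exact-solve) inverse subspace iteration for Ax = lam*Bx,
    targeting the k eigenvalues of smallest magnitude."""
    n = A.shape[0]
    X = np.linalg.qr(np.random.default_rng(0).standard_normal((n, k)))[0]
    for _ in range(iters):
        Y = solve(A, B @ X)            # the inexact variant performs this solve only approximately
        X, _ = qr(Y, mode='economic')  # re-orthonormalize the block
    # Rayleigh-Ritz extraction on the converged subspace
    theta, V = eigh(X.T @ A @ X, X.T @ B @ X)
    return theta, X @ V

# Usage on a tiny random symmetric problem with B = I.
n = 50
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
A = M + M.T                        # symmetric test matrix
B = np.eye(n)                      # symmetric positive definite "mass" matrix
theta, V = inverse_subspace_iteration(A, B, k=3)
print(theta)                       # approximations to the 3 smallest-magnitude eigenvalues
```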
Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part I: Seeking one eigenvalue (Andreas Stathopoulos, July 2005)
Large, sparse, Hermitian eigenvalue problems are still some of the most computationally challenging tasks. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has surfaced as a clear winner. In this research we approach the eigenproblem from the nonlinear perspective that helps us develop two nearly optim...
Minimization principles and computation for the generalized linear response eigenvalue problem
The minimization principle and Cauchy-like interlacing inequalities for the generalized linear response eigenvalue problem are presented. Based on these theoretical results, the best approximations through structure-preserving subspace projection and a locally optimal block conjugate gradient-like algorithm for simultaneously computing the first few smallest eigenvalues with the positive sign a...
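For a feel of what a locally optimal block conjugate-gradient-type eigensolver does, the snippet below calls SciPy's LOBPCG on an ordinary symmetric positive definite matrix to obtain a few smallest eigenvalues; this is only an analogy, since the algorithm in the paper above is structure-preserving for the special block form of the linear response problem, which plain LOBPCG does not exploit.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

# Symmetric positive definite test matrix (1-D Laplacian) and a random starting block.
n, k = 100, 4
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format='csr')
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))

# largest=False asks for the k smallest eigenvalues of the symmetric problem.
vals, vecs = lobpcg(A, X, largest=False, tol=1e-8, maxiter=200)
print(vals)
```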